
    Enhancing Human-Robot Collaboration Transportation through Obstacle-Aware Vibrotactile Feedback

    Transporting large and heavy objects can benefit from Human-Robot Collaboration (HRC), increasing the contribution of robots to our daily tasks and reducing the risk of injuries to the human operator. This approach usually casts the human collaborator as the leader, while the robot takes the follower role. Hence, it is essential for the leader to be aware of the environmental situation. However, when transporting a large object, the operator's situational awareness can be compromised, as the object may occlude different parts of the environment. This paper proposes a novel haptic-based environmental awareness module for a collaborative transportation framework that informs the human operator about surrounding obstacles. The robot uses two LIDARs to detect obstacles in the surroundings. The warning module alerts the operator through a haptic belt with four vibrotactile devices that provide feedback about the location and proximity of the obstacles. By enhancing the operator's awareness of the surroundings, the proposed module improves the safety of the human-robot team in co-carrying scenarios by preventing collisions. Experiments with two non-expert subjects in two different situations show that the human partner can successfully lead the co-transportation system through an unknown environment with hidden obstacles thanks to the haptic feedback.
    Comment: 6 pages, 5 figures; for the associated video, see https://youtu.be/UABeGPIIrH
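    As a rough illustration of the warning logic summarized above, the sketch below maps LIDAR ranges to per-sector vibration intensities for a four-motor belt. The sector layout, distance thresholds, and linear intensity law are illustrative assumptions, not details taken from the paper.

        # Illustrative sketch: sector layout, thresholds, and the linear
        # intensity law are assumptions, not the paper's design.
        SECTORS = ("front", "left", "back", "right")
        WARN_DIST = 1.5   # hypothetical distance (m) at which warnings start
        MIN_DIST = 0.3    # hypothetical distance (m) of maximum vibration

        def sector_of(angle_deg):
            """Map a LIDAR beam angle (degrees) to one of four belt sectors."""
            a = angle_deg % 360
            if a < 45 or a >= 315:
                return "front"
            if a < 135:
                return "left"
            if a < 225:
                return "back"
            return "right"

        def belt_intensities(scan):
            """scan: iterable of (angle_deg, range_m) pairs from the LIDARs.
            Returns a vibration intensity in [0, 1] per sector that grows as
            the nearest obstacle in that sector approaches."""
            nearest = {s: float("inf") for s in SECTORS}
            for angle, dist in scan:
                s = sector_of(angle)
                nearest[s] = min(nearest[s], dist)
            return {s: 0.0 if d >= WARN_DIST
                    else min(1.0, (WARN_DIST - d) / (WARN_DIST - MIN_DIST))
                    for s, d in nearest.items()}

    Each resulting intensity would then drive the corresponding vibrotactile device on the belt.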

    A Novel Haptic Feature Set for the Classification of Interactive Motor Behaviors in Collaborative Object Transfer

    Haptics provides a natural and intuitive channel of communication during the interaction of two humans in complex physical tasks, such as joint object transportation. However, despite the utmost importance of touch in physical interactions, the use of haptics is underrepresented in the development of intelligent systems. This study explores the prominence of haptic data for extracting information about the underlying interaction patterns in human-human cooperation. For this purpose, we design salient haptic features that describe the collaboration quality within a physical dyadic task and investigate the use of these features to classify the interaction patterns. We categorize the interaction into four discrete behavior classes, which describe whether the partners work in harmony or face conflicts while jointly transporting an object through translational or rotational movements. We test the proposed features on a physical human-human interaction (pHHI) dataset consisting of data collected from 12 human dyads. Using these data, we verify the salience of the haptic features by achieving a correct classification rate of over 91% with a Random Forest classifier.
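    A minimal sketch of the classification step, assuming scikit-learn and placeholder data; the paper's actual haptic feature set and windowing are not reproduced here.

        # Placeholder data stands in for the real pHHI features; the four
        # labels mirror the harmony/conflict behavior classes.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(480, 10))    # rows: interaction windows;
                                          # columns: haptic features (placeholder)
        y = rng.integers(0, 4, size=480)  # four behavior classes (placeholder)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"mean 5-fold accuracy: {scores.mean():.2f}")

    With the paper's salient haptic features in place of the random placeholders, this kind of pipeline is what achieves the reported classification rate.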

    Robot-Assisted Navigation for Visually Impaired through Adaptive Impedance and Path Planning

    This paper presents a framework for navigating visually impaired people through unfamiliar environments by means of a mobile manipulator. The human-robot system consists of three key components: a mobile base, a robotic arm, and the human subject, who is guided by the robotic arm through physical coupling of their hand with the cobot's end-effector. Given a goal from the user, these components traverse a collision-free set of waypoints in a coordinated manner while avoiding static and dynamic obstacles through an obstacle avoidance unit and a novel human guidance planner. To this end, we also present a leg-tracking algorithm that uses 2D LiDAR sensors integrated into the mobile base to monitor the human's pose. Additionally, we introduce an adaptive pulling planner responsible for guiding the individual back to the intended path if they veer off course. This is achieved by setting a target arm end-effector position and dynamically adjusting the impedance parameters in real time through an impedance tuning unit. To validate the framework, we present a set of experiments, both in laboratory settings with 12 healthy blindfolded subjects and as a proof-of-concept demonstration in a real-world scenario.
    Comment: 7 pages, 7 figures; submitted to the IEEE International Conference on Robotics and Automation; for the associated video, see https://youtu.be/B94n3QjdnJ
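    The adaptive pulling behavior can be sketched as a stiffness schedule on a Cartesian impedance law: the further the person drifts from the planned path, the stiffer the virtual spring pulling their hand back. The gain bounds and the linear schedule below are illustrative assumptions, not the paper's tuning rule.

        # Illustrative sketch: bounds and the linear schedule are assumed,
        # not the paper's impedance-tuning law.
        import numpy as np

        K_MIN, K_MAX = 50.0, 400.0  # hypothetical stiffness bounds (N/m)
        DEV_MAX = 0.5               # deviation (m) at which stiffness saturates

        def pulling_force(x_hand, x_target, v_hand, mass=1.0):
            """Force at the end-effector that pulls the guided hand toward
            the planner's target, stiffening as the person veers off course."""
            dev = np.linalg.norm(x_target - x_hand)
            k = K_MIN + (K_MAX - K_MIN) * min(dev / DEV_MAX, 1.0)
            d = 2.0 * np.sqrt(k * mass)  # keep the response well damped
            return k * (x_target - x_hand) - d * v_hand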

    A variable-fractional order admittance controller for pHRI

    In today's automation-driven manufacturing environments, emerging technologies like cobots (collaborative robots) and augmented reality interfaces can help integrate humans into the production workflow and benefit from their adaptability and cognitive skills. In such settings, humans are expected to work with robots side by side and physically interact with them. However, the trade-off between stability and transparency is a core challenge in physical human-robot interaction (pHRI). While stability is of utmost importance for safety, transparency is required to fully exploit the precision and ability of robots in handling labor-intensive tasks. In this work, we propose a new variable admittance controller based on fractional-order control to handle this trade-off more effectively. We compared the performance of the fractional-order variable admittance controller with a classical admittance controller with fixed parameters as a baseline and with an integer-order variable admittance controller during a realistic drilling task. Our comparisons indicate that the proposed controller leads to a more transparent interaction than the other controllers without sacrificing stability. We also demonstrate a use case for an augmented reality (AR) headset that can augment human sensory capabilities to reach a target drilling depth, which would otherwise not be possible without making the robot the decision maker.
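    As a rough sketch of how a fractional-order admittance can be realized in discrete time, the code below solves m*D^alpha(v) + b*v = f at each control step using a Grünwald-Letnikov approximation of the fractional derivative. The admittance form, its parameters, and any schedule for varying alpha are assumptions, not the paper's exact controller.

        # Illustrative sketch: the admittance form m*D^alpha(v) + b*v = f
        # and its parameters are assumptions, not the paper's controller.
        import numpy as np

        def gl_weights(alpha, n):
            """First n Grünwald-Letnikov weights for a derivative of order alpha."""
            w = np.empty(n)
            w[0] = 1.0
            for j in range(1, n):
                w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
            return w

        def admittance_step(f, v_hist, w, m, b, h, alpha):
            """One control step: solve m*D^alpha(v) + b*v = f for the new
            reference velocity. v_hist holds past velocities (newest first,
            shorter than w); w = gl_weights(alpha, n); h is the sample time."""
            tail = np.dot(w[1:1 + len(v_hist)], v_hist)
            scale = m / h ** alpha
            return (f - scale * tail) / (scale * w[0] + b)

    For alpha = 1 the weights reduce to [1, -1, 0, ...] and the update collapses to the classical backward-difference admittance m*dv/dt + b*v = f; varying alpha online between such laws is what makes the controller variable-order.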